Tsallis Mutual Information for Document Classification

Authors

  • Màrius Vila
  • Anton Bardera
  • Miquel Feixas
  • Mateu Sbert
Abstract

Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned documents. These three generalizations derive from the Kullback–Leibler distance, the difference between entropy and conditional entropy, and the Jensen–Tsallis divergence, respectively. In addition, the ratio between these measures and the Tsallis joint entropy is analyzed. The performance of all these measures is studied for different entropic indexes in the context of document classification and registration.
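As a rough illustration of the kind of measure studied here, the sketch below computes the standard Tsallis entropy H_q(p) = (1 − Σ_i p_i^q)/(q − 1) and a simple additive-form Tsallis mutual information MI_q = H_q(X) + H_q(Y) − H_q(X, Y) from a joint grey-level histogram of two document images. This is a minimal sketch under those assumptions only; the exact forms of the three generalizations analyzed in the paper (KL-based, conditional-entropy-based, and Jensen–Tsallis) and their normalization by the joint entropy may differ.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy H_q(p) = (1 - sum_i p_i**q) / (q - 1) of a distribution p."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    if np.isclose(q, 1.0):
        # The limit q -> 1 recovers the Shannon entropy.
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def tsallis_mi(joint_hist, q):
    """Additive-form Tsallis mutual information MI_q = H_q(X) + H_q(Y) - H_q(X, Y),
    computed from a 2-D joint histogram of two images (an illustrative assumption,
    not necessarily the combination used in the paper)."""
    joint = np.asarray(joint_hist, dtype=float)
    joint = joint / joint.sum()
    px = joint.sum(axis=1)   # marginal of the first image
    py = joint.sum(axis=0)   # marginal of the second image
    return tsallis_entropy(px, q) + tsallis_entropy(py, q) - tsallis_entropy(joint, q)

# Toy usage: a random joint histogram standing in for two 8-level scanned documents.
rng = np.random.default_rng(0)
joint = rng.integers(1, 50, size=(8, 8))
for q in (0.5, 1.0, 2.0):
    print(f"q = {q}: MI_q = {tsallis_mi(joint, q):.4f}")
```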


Related articles

Plant Classification in Images of Natural Scenes Using Segmentations Fusion

This paper presents a novel approach to automatically classifying and identifying tree leaves using image segmentation fusion. With the development of mobile devices and remote access, automatic plant identification in images taken in natural scenes has received much attention. Image segmentation plays a key role in most plant identification methods, especially in complex background images. Wher...


Fundamental properties on Tsallis entropies

A chain rule and a subadditivity for the entropy of type β, which is one of the nonextensive (nonadditive) entropies, were derived by Z. Daróczy. In this paper, we study further relations among Tsallis type entropies, which are typical nonextensive entropies. We show some inequalities related to Tsallis entropies, especially the strong subadditivity for Tsallis type entropies and the subaddit...
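For context on what "nonadditive" means here, the following is the commonly stated pseudo-additivity of the Tsallis entropy for independent X and Y, together with the subadditivity inequality usually given for q ≥ 1; these are standard conventions assumed for illustration, not formulas quoted from the cited paper.

```latex
% Pseudo-additivity for independent X and Y (standard Tsallis convention):
H_q(X, Y) = H_q(X) + H_q(Y) + (1 - q)\, H_q(X)\, H_q(Y)

% Subadditivity, commonly established for q >= 1:
H_q(X, Y) \le H_q(X) + H_q(Y), \qquad q \ge 1
```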


Nonextensive Information Theoretic Kernels on Measures

Positive definite kernels on probability measures have been recently applied to classification problems involving text, images, and other types of structured data. Some of these kernels are related to classic information theoretic quantities, such as (Shannon's) mutual information and the Jensen–Shannon (JS) divergence. Meanwhile, there have been recent advances in nonextensive generalizations o...


Performance comparison of new nonparametric independent component analysis algorithm for different entropic indexes

Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only measure in the literature. In this paper, instead of Shannon entropy, Tsallis entropy is used and a novel ICA algorithm, which uses kernel density estimation (KDE) for estimation of source distributions, is proposed. KDE is di...


Fast and accurate image registration using Tsallis entropy and simultaneous perturbation stochastic approximation - Electronics Letters

The Tsallis measure of mutual information is combined with the simultaneous perturbation stochastic approximation algorithm to register images. It is shown that Tsallis entropy can improve registration accuracy and speed of convergence, compared with Shannon entropy, in the calculation of mutual information. Simulation results show that the new algorithm achieves up to seven times faster conver...
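As a rough sketch of how such a registration loop could be driven, the code below pairs a generic similarity function (standing in for a Tsallis mutual information between a fixed and a transformed moving image) with a basic simultaneous perturbation stochastic approximation (SPSA) update. The gain sequences, perturbation distribution, and toy similarity are assumptions chosen for illustration, not the settings used in the cited letter.

```python
import numpy as np

def spsa_maximize(similarity, theta0, iters=200, a=0.1, c=0.05,
                  alpha=0.602, gamma=0.101, seed=0):
    """Basic SPSA ascent on a noisy similarity function. Only two similarity
    evaluations are needed per iteration, regardless of the number of
    transform parameters."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(iters):
        ak = a / (k + 1) ** alpha          # step-size gain sequence
        ck = c / (k + 1) ** gamma          # perturbation gain sequence
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Bernoulli +/-1 perturbation
        g_hat = (similarity(theta + ck * delta) -
                 similarity(theta - ck * delta)) / (2.0 * ck * delta)
        theta = theta + ak * g_hat         # ascend the similarity measure
    return theta

# Toy usage: a smooth peak plus noise stands in for an actual Tsallis-MI
# evaluation between two images; the optimum is at the "true" shift.
rng = np.random.default_rng(1)
true_shift = np.array([3.0, -2.0])
similarity = lambda t: -np.sum((t - true_shift) ** 2) + 0.01 * rng.normal()
print(spsa_maximize(similarity, theta0=[0.0, 0.0]))
```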



Journal:
  • Entropy

Volume 13, Issue -

Pages -

Publication date: 2011